Belief perseverance (also known as conceptual conservatism) is the maintenance of a belief despite new information that firmly contradicts it.
Since rationality involves conceptual flexibility, belief perseverance is consistent with the view that human beings act at times in an irrational manner. Philosopher F.C.S. Schiller holds that belief perseverance "deserves to rank among the fundamental 'laws' of nature".
If beliefs are strengthened when others attempt to present evidence debunking them, this is known as a backfire effect. There are psychological mechanisms by which backfire effects could potentially occur, but the evidence on this topic is mixed, and backfire effects are very rare in practice.
A 2020 review of the scientific literature on backfire effects found that there have been widespread failures to replicate their existence, even under conditions that theoretically would be favorable to observing them. Due to the lack of reproducibility, most researchers believe that backfire effects either are unlikely to occur on the broader population level, or only occur in very specific circumstances, or do not exist.
For most people, corrections and fact-checking are very unlikely to have a negative effect, and there is no specific group of people in which backfire effects have been consistently observed.
The first study of belief perseverance was carried out by Leon Festinger, Henry Riecken, and Stanley Schachter. These social psychologists spent time with members of a doomsday cult who believed the world would end on December 21, 1954. Despite the failure of the forecast, most believers continued to adhere to their faith. In When Prophecy Fails (1956) and A Theory of Cognitive Dissonance (1957), Festinger proposed that human beings strive for internal psychological consistency in order to function mentally in the real world. A person who experiences internal inconsistency tends to become psychologically uncomfortable and is motivated to reduce the cognitive dissonance (Festinger, L. (1957). A Theory of Cognitive Dissonance. Stanford, CA: Stanford University Press). Such people tend to make changes that justify the stressful behavior, either by adding new parts to the cognition causing the psychological dissonance (rationalization) or by avoiding circumstances and contradictory information likely to increase the magnitude of the cognitive dissonance (confirmation bias).
When asked to reappraise their probability estimates in light of the new information, the believers displayed a marked tendency to give insufficient weight to the new evidence. They refused to acknowledge that the inaccurate prediction reflected on the overall validity of their faith, and in some cases they reported an even stronger faith in their religion than before.
In a separate study, mathematically capable teenagers and adults were given seven arithmetic problems and first asked to estimate approximate solutions by hand. Then, using a calculator rigged to provide increasingly erroneous figures (e.g., yielding 252 × 1.2 = 452.4, when the correct answer is 302.4), they were asked for exact answers. About half of the participants went through all seven tasks while commenting on their estimating abilities or tactics, never letting go of the belief that calculators are infallible. They simply refused to admit that their previous assumptions about calculators could have been incorrect.
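The study's actual rigging scheme is not described here; purely as an illustrative sketch, the following hypothetical routine shows one way a calculator could be made to drift further from the truth on each successive problem. The error step of 30 per problem is an assumption, chosen only so that problem five reproduces the 452.4 figure cited above.

```python
# Hypothetical sketch of a "rigged calculator" of the kind described above.
# The real study's error schedule is not given here; this assumes the
# displayed answer drifts from the truth by a fixed step per problem.

def rigged_multiply(a: float, b: float, problem_number: int,
                    step: float = 30.0) -> float:
    """Return a * b plus an error that grows with each problem."""
    true_product = a * b
    return round(true_product + step * problem_number, 1)

# With this assumed schedule, problem 5 reproduces the figure cited above:
# 252 * 1.2 = 302.4, but the rigged display reads 452.4.
for n in range(1, 8):
    print(f"problem {n}: 252 x 1.2 -> {rigged_multiply(252, 1.2, n)}")
```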
Lee Ross and Craig A. Anderson led some subjects to the false belief that there existed a positive correlation between a firefighter's stated preference for taking risks and their occupational performance. Other subjects were told that the correlation was negative. The participants were then thoroughly debriefed and informed that there was no link between risk taking and performance. These authors found that post-debriefing interviews pointed to significant levels of belief perseverance.
In another study, subjects spent about four hours following the instructions of a hands-on instructional manual. At a certain point, the manual introduced a formula which led them to believe that spheres are 50 percent larger than they are. Subjects were then given an actual sphere and asked to determine its volume, first by using the formula, and then by filling the sphere with water, transferring the water to a box, and directly measuring the volume of the water in the box. In the last experiment in this series, all 19 subjects held a Ph.D. degree in a natural science, were employed as researchers or professors at two major universities, and carried out the comparison between the two volume measurements a second time with a larger sphere. All but one of these scientists clung to the spurious formula despite their empirical observations.
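The spurious formula itself is not reproduced here; as an illustration of the size of the discrepancy, a formula inflating the true sphere volume by 50 percent would read:

$$V_{\text{true}} = \frac{4}{3}\pi r^{3}, \qquad V_{\text{spurious}} = \frac{3}{2}\cdot\frac{4}{3}\pi r^{3} = 2\pi r^{3}$$

Under this assumption, the water measurement would come in at only two thirds of the computed figure, a gap far too large to attribute to imprecise pouring.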
Brendan Nyhan, one of the researchers who initially proposed the occurrence of backfire effects, wrote in 2021 that, given the widespread failures to replicate them, the persistence of misinformation is most likely due to other factors.
Presenting people with factual corrections has been demonstrated to have a positive effect in many circumstances; for example, this has been studied in the case of informing believers in 9/11 conspiracy theories about statements by actual experts and witnesses. One possibility is that criticism is most likely to backfire if it challenges someone's worldview or identity, which suggests that an effective approach may be to provide criticism while avoiding such challenges.
In many cases, when backfire effects have been discussed by the media or by bloggers, findings from studies on specific subgroups have been over-generalized to the incorrect conclusion that backfire effects apply to the entire population and to all attempts at correction.
Belief perseverance is frequently accompanied by intrapersonal cognitive processes. "When the decisive facts did at length obtrude themselves upon my notice," wrote the chemist Joseph Priestley, "it was very slowly, and with great hesitation, that I yielded to the evidence of my senses."
Peter Marris suggests that the process of abandoning a conviction is similar to the working out of grief. "The impulse to defend the predictability of life is a fundamental and universal principle of human psychology." Human beings possess "a deep-rooted and insistent need for continuity".
Philosopher of science Thomas Kuhn points to the resemblance between conceptual change and Gestalt perceptual shifts (e.g., the difficulty encountered in seeing the hag as a young lady). Hence, the difficulty of switching from one conviction to another could be traced to the difficulty of rearranging one's perceptual or cognitive field.